Real-Parameter Black-Box Optimization Benchmarking 2009: Experimental Setup

Authors

  • Nikolaus Hansen
  • Anne Auger
  • Steffen Finck
  • Raymond Ros
Abstract

Quantifying and comparing the performance of optimization algorithms is one important aspect of research in search and optimization. However, this task turns out to be tedious and difficult to realize, even in the single-objective case, at least if one is willing to accomplish it in a scientifically decent and rigorous way. The BBOB 2009 workshop will furnish most of this tedious task for its participants: (1) choice and implementation of a well-motivated single-objective benchmark function testbed, (2) design of an experimental set-up, (3) generation of data output for (4) post-processing and presentation of the results in graphs and tables. What remains to be done by the participants is to allocate CPU time, run their favorite black-box real-parameter optimizer in a few dimensions a few hundred times, and execute the provided post-processing script afterwards.

Two testbeds are provided:

  • noise-free functions
  • noisy functions

The participants can freely choose any or all of them. During the workshop the overall procedure will be critically reviewed, the algorithms will be presented by the participants, and quantitative performance measurements of all submitted algorithms will be presented, categorized by early and late performance and by function properties such as multimodality, ill-conditioning, symmetry, ridge-solving, coarse- and fine-grain ruggedness, weak global structure, and outlier noise. This document, the benchmark function definitions, and the source code of the benchmark functions and of the post-processing are available at http://coco.gforge.inria.fr/doku.php?id=bbob-2009.

∗ NH is with the Microsoft Research–INRIA Joint Centre, 28 rue Jean Rostand, 91893 Orsay Cedex, France.
† AA is with the TAO Team, INRIA Saclay, Université Paris Sud, LRI, 91405 Orsay Cedex, France.
‡ SF is with the Research Center PPE, University of Applied Science Vorarlberg, Hochschulstrasse 1, 6850 Dornbirn, Austria.
§ RR is with the Univ. Paris-Sud, LRI, UMR 8623 / INRIA Saclay, projet TAO, F-91405 Orsay, France.
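To make the remaining participant effort concrete, the following is a minimal sketch of a benchmarking loop with independent restarts. It assumes the present-day COCO Python module cocoex and uses scipy.optimize as a stand-in for the participant's own optimizer; the 2009 release instead shipped MATLAB/C/Java wrappers, so the names below are illustrative rather than the original 2009 interface.

    # Illustrative sketch only: assumes the present-day COCO Python module
    # `cocoex` rather than the original 2009 MATLAB/C/Java wrappers, and uses
    # scipy.optimize as a placeholder for "your favorite optimizer".
    import numpy as np
    import scipy.optimize
    import cocoex  # COCO experimentation module (assumed available)

    suite = cocoex.Suite("bbob", "", "")                          # noise-free testbed
    observer = cocoex.Observer("bbob", "result_folder: MYALGO")   # logs data for post-processing

    for problem in suite:                         # loops over functions, dimensions, instances
        problem.observe_with(observer)            # write the logging output for this problem
        budget = 1000 * problem.dimension         # assumed evaluation budget per problem
        while problem.evaluations < budget and not problem.final_target_hit:
            # independent restart from a uniformly random point in [-5, 5]^D
            x0 = problem.lower_bounds + (
                problem.upper_bounds - problem.lower_bounds) * np.random.rand(problem.dimension)
            scipy.optimize.minimize(
                problem, x0, method="Nelder-Mead",
                options={"maxfev": budget - problem.evaluations})

The data written by the observer would then be turned into figures and tables by the provided post-processing, e.g. python -m cocopp MYALGO in current COCO releases (again an assumption; the 2009 scripts were invoked differently).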


Related Articles

COCO: The Experimental Procedure

We present a budget-free experimental setup and procedure for benchmarking numerical optimization algorithms in a black-box scenario. This procedure can be applied with the COCO benchmarking platform. We describe initialization of and input to the algorithm and touch upon the relevance of termination and restarts.


BMOBench: Black-Box Multi-Objective Optimization Benchmarking Platform

This document briefly describes the Black-Box Multi-Objective Optimization Benchmarking (BMOBench) platform. It presents the test problems, evaluation procedure, and experimental setup. To this end, the BMOBench is demonstrated by comparing recent multi-objective solvers from the literature, namely SMS-EMOA (Beume et al., 2007), DMS (Custódio et al., 2011), and MO-SOO (Al-Dujaili and Suresh, 20...


Real-Parameter Black-Box Optimization Benchmarking: Experimental Setup

Quantifying and comparing performance of numerical optimization algorithms is an important aspect of research in search and optimization. However, this task turns out to be tedious and difficult to realize even in the single-objective case – at least if one is willing to accomplish it in a scientifically decent and rigorous way. The COCO software used for the BBOB workshops (2009, 2010 and 2012...


Real-Parameter Black-Box Optimization Benchmarking BBOB-2010: Experimental Setup

Quantifying and comparing performance of numerical optimization algorithms is one important aspect of research in search and optimization. However, this task turns out to be tedious and difficult to realize even in the single-objective case – at least if one is willing to accomplish it in a scientifically decent and rigorous way. The BBOB 2010 workshop will furnish most of this tedious task for...


Benchmarking Parameter-Free AMaLGaM on Functions With and Without Noise

We describe a parameter-free estimation-of-distribution algorithm (EDA) called the adapted maximum-likelihood Gaussian model iterated density-estimation evolutionary algorithm (AMaLGaM-IDEA, or AMaLGaM for short) for numerical optimization. AMaLGaM is benchmarked within the 2009 black box optimization benchmarking (BBOB) framework and compared to a variant with incremental mod...





Publication year: 2009